Showing 1 - 8 of 8 matches in All Departments

An Introduction to Statistical Learning - with Applications in Python (1st ed. 2023)
Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani, Jonathan Taylor
R2,956 Discovery Miles 29 560 Ships in 12 - 17 working days

An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance to marketing to astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, deep learning, survival analysis, multiple testing, and more. Color graphics and real-world examples are used to illustrate the methods presented. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. Four of the authors co-wrote An Introduction to Statistical Learning, With Applications in R (ISLR), which has become a mainstay of undergraduate and graduate classrooms worldwide, as well as an important reference book for data scientists. One of the keys to its success was that each chapter contains a tutorial on implementing the analyses and methods presented in the R scientific computing environment. However, in recent years Python has become a popular language for data science, and there has been increasing demand for a Python-based alternative to ISLR. Hence, this book (ISLP) covers the same materials as ISLR but with labs implemented in Python. These labs will be useful for Python novices and experienced users alike.
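
As a taste of what those Python labs involve (an illustrative sketch only, using scikit-learn on synthetic data rather than the book's own lab code and datasets), a basic least squares fit looks roughly like this:

    # Hypothetical example, not taken from the book's labs:
    # fit a simple linear regression to synthetic data with scikit-learn.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 1))                      # one synthetic predictor
    y = 2.0 + 3.0 * X[:, 0] + rng.normal(scale=0.5, size=100)

    model = LinearRegression().fit(X, y)
    print(model.intercept_, model.coef_)               # estimates should land near 2 and 3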

An Introduction to Statistical Learning - with Applications in R (Hardcover, 2nd ed. 2021)
Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani
R2,280 R2,120 Discovery Miles 21 200 Save R160 (7%) Ships in 9 - 15 working days

An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance to marketing to astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, deep learning, survival analysis, multiple testing, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra. This Second Edition features new chapters on deep learning, survival analysis, and multiple testing, as well as expanded treatments of naive Bayes, generalized linear models, Bayesian additive regression trees, and matrix completion. R code has been updated throughout to ensure compatibility.
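
For a sense of how a shrinkage approach is used in practice (a hedged Python sketch with scikit-learn; the book's own labs are written in R), ridge regression with its penalty chosen by cross-validation, combining the shrinkage and resampling topics listed above, can be written as:

    # Hypothetical illustration, not the book's R lab code:
    # ridge regression with the penalty parameter selected by cross-validation.
    import numpy as np
    from sklearn.linear_model import RidgeCV

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 10))
    beta = np.array([3.0, -2.0] + [0.0] * 8)           # only two informative predictors
    y = X @ beta + rng.normal(size=200)

    ridge = RidgeCV(alphas=np.logspace(-3, 3, 25)).fit(X, y)
    print(ridge.alpha_)                                # penalty chosen by cross-validation
    print(ridge.coef_.round(2))                        # coefficients shrunk toward zero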

The Elements of Statistical Learning - Data Mining, Inference, and Prediction, Second Edition (Hardcover, 2nd ed. 2009, Corr. 9th printing 2017)
Trevor Hastie, Robert Tibshirani, Jerome Friedman
R1,977 Discovery Miles 19 770 Ships in 12 - 17 working days

During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It is a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting---the first comprehensive treatment of this topic in any book.

This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression and path algorithms for the lasso, non-negative matrix factorization, and spectral clustering. There is also a chapter on methods for "wide" data (p bigger than n), including multiple testing and false discovery rates.
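
The multiple testing and false discovery rate material mentioned above can be illustrated with a short sketch of the Benjamini-Hochberg step-up procedure, a standard FDR-controlling method (this Python illustration is our own, not code from the book):

    # Illustrative sketch of the Benjamini-Hochberg procedure for FDR control.
    import numpy as np

    def benjamini_hochberg(pvals, q=0.05):
        """Return a boolean mask of hypotheses rejected at FDR level q."""
        p = np.asarray(pvals, dtype=float)
        m = p.size
        order = np.argsort(p)
        thresholds = q * np.arange(1, m + 1) / m       # q * i / m for the i-th smallest p-value
        below = p[order] <= thresholds
        reject = np.zeros(m, dtype=bool)
        if below.any():
            k = np.max(np.nonzero(below)[0])           # largest i with p_(i) <= q * i / m
            reject[order[: k + 1]] = True
        return reject

    # The three smallest p-values are flagged at q = 0.05:
    print(benjamini_hochberg([0.001, 0.008, 0.012, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.9]))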

Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie co-developed much of the statistical modeling software and environment in R/S-PLUS and invented principal curves and surfaces. Tibshirani proposed the lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, projection pursuit and gradient boosting.

Statistical Learning with Sparsity - The Lasso and Generalizations (Paperback)
Trevor Hastie, Robert Tibshirani, Martin Wainwright
R1,283 Discovery Miles 12 830 Ships in 9 - 15 working days

Discover New Methods for Dealing with High-Dimensional Data: A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data. Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso. In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.
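
As a rough illustration of the coordinate descent algorithm the description refers to (a minimal Python sketch assuming standardized predictors and no intercept; the authors' own production implementation is the R package glmnet), the lasso updates can be coded as:

    # Hypothetical sketch: cyclic coordinate descent for the lasso,
    # minimizing (1/2n)||y - X beta||^2 + lam * ||beta||_1.
    import numpy as np

    def soft_threshold(z, gamma):
        return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

    def lasso_cd(X, y, lam, n_iter=100):
        n, p = X.shape
        beta = np.zeros(p)
        for _ in range(n_iter):
            for j in range(p):
                r_j = y - X @ beta + X[:, j] * beta[j]            # partial residual excluding predictor j
                z = X[:, j] @ r_j / n
                beta[j] = soft_threshold(z, lam) / (X[:, j] @ X[:, j] / n)
        return beta

    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 5))
    X = (X - X.mean(0)) / X.std(0)                                # standardize the predictors
    y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=200)
    print(lasso_cd(X, y, lam=0.1).round(2))                       # a sparse coefficient estimate

Each pass soft-thresholds one coefficient at a time while holding the others fixed, which is what makes this algorithm simple and fast for sparse problems.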

Statistical Learning with Sparsity - The Lasso and Generalizations (Hardcover)
Trevor Hastie, Robert Tibshirani, Martin Wainwright
R3,196 Discovery Miles 31 960 Ships in 9 - 15 working days

Discover New Methods for Dealing with High-Dimensional Data: A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data. Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso. In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.

An Introduction to Statistical Learning - with Applications in R (Paperback, 2nd ed. 2021)
Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani
R1,915 Discovery Miles 19 150 Ships in 10 - 15 working days

An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance to marketing to astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, deep learning, survival analysis, multiple testing, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra. This Second Edition features new chapters on deep learning, survival analysis, and multiple testing, as well as expanded treatments of naive Bayes, generalized linear models, Bayesian additive regression trees, and matrix completion. R code has been updated throughout to ensure compatibility.

The Science of Bradley Efron - Selected Papers (Paperback, Softcover reprint of hardcover 1st ed. 2008)
Carl N. Morris, Robert Tibshirani
R3,326 Discovery Miles 33 260 Ships in 10 - 15 working days

Nature didn't design human beings to be statisticians, and in fact our minds are more naturally attuned to spotting the saber-toothed tiger than seeing the jungle he springs from. Yet scientific discovery in practice is often more jungle than tiger. Those of us who devote our scientific lives to the deep and satisfying subject of statistical inference usually do so in the face of a certain under-appreciation from the public, and also (though less so these days) from the wider scientific world. With this in mind, it feels very nice to be over-appreciated for a while, even at the expense of weathering a 70th birthday. (Are we certain that some terrible chronological error hasn't been made?) Carl Morris and Rob Tibshirani, the two colleagues I've worked most closely with, both fit my ideal profile of the statistician as a mathematical scientist working seamlessly across wide areas of theory and application. They seem to have chosen the papers here in the same catholic spirit, and then cajoled an all-star cast of statistical savants to comment on them.

The Science of Bradley Efron - Selected Papers (Hardcover, 2008 ed.)
Carl N. Morris, Robert Tibshirani
R4,639 Discovery Miles 46 390 Ships in 10 - 15 working days

Nature didn't design human beings to be statisticians, and in fact our minds are more naturally attuned to spotting the saber-toothed tiger than seeing the jungle he springs from. Yet scientific discovery in practice is often more jungle than tiger. Those of us who devote our scientific lives to the deep and satisfying subject of statistical inference usually do so in the face of a certain under-appreciation from the public, and also (though less so these days) from the wider scientific world. With this in mind, it feels very nice to be over-appreciated for a while, even at the expense of weathering a 70th birthday. (Are we certain that some terrible chronological error hasn't been made?) Carl Morris and Rob Tibshirani, the two colleagues I've worked most closely with, both fit my ideal profile of the statistician as a mathematical scientist working seamlessly across wide areas of theory and application. They seem to have chosen the papers here in the same catholic spirit, and then cajoled an all-star cast of statistical savants to comment on them.
